Maximum-entropy from the probability calculus: exchangeability, sufficiency

Author

  • P.G.L. Porta Mana
Abstract

The classical maximum-entropy method (Jaynes 1963) appears in the probability calculus as an approximation to a particular model by exchangeability or to a particular model by sufficiency. The approximation from the exchangeability model can be inferred from an analysis by Jaynes (1996) and to some extent from works on entropic priors (Rodríguez 1989; 2002; Skilling 1989a; 1990). I tried to show it explicitly in a simple context (Porta Mana 2009). The approximation from the sufficiency model can be inferred from Bernardo & Smith (2000 § 4.5) and Diaconis & Freedman (1981) in combination with the Koopman-Pitman-Darmois theorem (see references in § 3). In this note I illustrate how each approximation arises, in turn, and then give a heuristic synopsis of both. At the end I discuss some questions: Prediction or retrodiction? Which of the two models is preferable? (The exchangeable one.) How good is the maximum-entropy approximation? Is this a “derivation” of maximum-entropy? I assume that you are familiar with: the maximum-(relative-)entropy method (Jaynes 1957a; much clearer in Jaynes 1963; Sivia 2006; Hobson et al. 1973), especially the mathematical form of its distributions and its prescription “expectations = empirical averages”; the probability calculus (Jaynes 2003; Hailperin 1996; Jeffreys 2003; Lindley 2014); and the basics of models by exchangeability and sufficiency (Bernardo & Smith 2000 ch. 4), although I’ll try to explain the basic ideas behind them – you have likely often worked with them even if you’ve never heard of them under these names.
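For reference, the distributions produced by the maximum-relative-entropy method mentioned above have the standard exponential form; the following is a generic statement of the method as found in the cited references, with notation (q, f_k, λ_k, Z) of my own choosing rather than the note’s. Given a reference distribution q over outcomes x and constraint functions f_k with empirical averages \bar f_k,

\[
p(x) \;=\; \frac{q(x)\,\exp\!\bigl[\textstyle\sum_k \lambda_k f_k(x)\bigr]}{Z(\lambda)},
\qquad
Z(\lambda) \;=\; \sum_x q(x)\,\exp\!\bigl[\textstyle\sum_k \lambda_k f_k(x)\bigr],
\]

with the multipliers λ_k fixed by the prescription “expectations = empirical averages”:

\[
\sum_x p(x)\, f_k(x) \;=\; \frac{\partial \ln Z(\lambda)}{\partial \lambda_k} \;=\; \bar f_k .
\]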

Similar articles

Determination of Maximum Bayesian Entropy Probability Distribution

In this paper, we consider methods for determining maximum-entropy multivariate distributions with a given prior under the constraints that the marginal distributions, or the marginals together with the covariance matrix, are prescribed. Numerical solutions are then considered for cases where no closed-form solution is available. Finally, the methods are illustrated with some numerical examples.
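As a rough illustration of this kind of numerical determination (not the paper’s own method or data), here is a minimal Python sketch that finds the maximum-entropy distribution on a finite grid when a mean and second moment are prescribed; the grid, prior, and target moments are invented for the example.

import numpy as np
from scipy.optimize import minimize

x = np.linspace(-3.0, 3.0, 61)           # support points of the toy distribution
q = np.ones_like(x) / x.size             # prior: uniform on the grid
f = np.vstack([x, x**2])                 # constraint functions f_k(x): mean and second moment
target = np.array([0.5, 1.2])            # prescribed values of the two expectations

def dual(lam):
    # Lagrangian dual of the maximum-entropy problem: log Z(lam) - lam . target
    logZ = np.log(np.sum(q * np.exp(lam @ f)))
    return logZ - lam @ target

res = minimize(dual, np.zeros(2), method="BFGS")   # smooth convex problem: BFGS suffices
lam = res.x
p = q * np.exp(lam @ f)
p /= p.sum()
print("achieved expectations:", f @ p)             # should reproduce `target`

Minimising the dual over the multipliers enforces the prescription “expectations = empirical averages” automatically, since the gradient of log Z with respect to each multiplier is the corresponding expectation of f_k.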

Reasoning with Probabilities and Maximum Entropy: The System PIT and its Application in LEXMED

We present a theory, a system and an application for common sense reasoning based on propositional logic, the probability calculus and the concept of maximum entropy. The task of the system Pit (Probability Induction Tool) is to provide decisions under incomplete knowledge, while keeping the necessary additional assumptions as minimal and clear as possible. We therefore enrich the probability c...

Tractability through Exchangeability: A New Perspective on Efficient Probabilistic Inference

Exchangeability is a central notion in statistics and probability theory. The assumption that an infinite sequence of data points is exchangeable is at the core of Bayesian statistics. However, finite exchangeability as a statistical property that renders probabilistic inference tractable is less well-understood. We develop a theory of finite exchangeability and its relation to tractable probab...

Introduction to Bayesian Inference

These notes are the write-up of a NIKHEF topical lecture series on Bayesian inference. The topics covered are the definition of probability, elementary probability calculus and assignment, selection of least informative probabilities by the maximum-entropy principle, parameter estimation, systematic error propagation, model selection, and the stopping problem in counting experiments.

Publication date: 2017